Ontological Dependency
Abstract
Successful ontological analysis depends upon having the right underlying theory. The work described here, which explored how to understand organisations as systems of social norms, found that the familiar objectivist position did not work, and eventually replaced it with a radically subjectivist ontology which treats every thing, relationship and attribute as a repertoire of behaviour as understood by some responsible agent. Gibson's Theory of Affordances supports this view in relation to our physical reality, and the concept of norms extends the theory naturally into the social domain. A formalism, Norma, which captures the need always to specify the responsible agent and some more or less complex repertoire of behaviour, introduces the concept of ontological dependency, where one repertoire depends for its existence on another. This unusual logical relationship allows one to devise schemas which can generate systems as by-products; the paper ends with an example dealing with health insurance. To emphasise the validity of the underlying philosophical position, the author writes the paper in E-Prime (Bourland 1974), English without the verb "to be", which forces one to abandon the objectivist way of thinking in favour of one that accounts for the world in terms of the actions of agents.

The commitment we make concerning what kinds of things exist must count as the deepest we make in forming a view of the world, so deep that as a rule it remains implicit in what we say. Habitual ways of thinking and language itself can keep our ontological assumptions submerged in the unconscious and, by doing so, these influences can hold us prisoner to a set of ideas. This paper explains a radically subjectivist ontological position which we eventually adopted to replace the old objectivist view that did not work. We intended our research on the problem of describing organisations accurately as information systems to lead to better ways of analysing, specifying and developing computer-based information systems. In this we have now succeeded, and in many other surprising ways besides, in particular in developing a method of semantic analysis that appears capable of generating a canonical semantic form. The crucial ingredient in this method, the concept of ontological dependency, I shall explain in this paper.

Ontological theory versus ontological engineering

Our research into methods of specifying organisations as information systems made use of legal norms as experimental material. (Organised behaviour implies regularities caused by people tending to follow norms; hence, to understand and specify organisations, we could try to do so by treating norms as our primitive concept. No more of this here; see Stamper 1994.) Naturally we encountered the same problem as Cyc, and in much the same light as Lenat and Guha (1990, p. 23): "Choosing a set of representation primitives (predicates, objects and functions) has been called ontological engineering, that is, defining the categories and relationships of the domain. (This is empirical, experimental engineering, as contrasted with ontological theorizing, which philosophers have done for millennia.)" We made significant progress, demonstrating an expert system shell at the workshop on computers and law in Swansea in 1979. But this attitude towards ontology will not work in the long run for large human systems.
If you want a machine to pass the Turing test for artificial intelligence, then you can build your own local solution for manipulating symbols in ways that will pull enough wool over the eyes of the individuals in the neighbouring room recruited to judge the machine or person undergoing the test. Turing's test avoids the real difficulties of ontology (pure nominalism suffices in a world of character-strings), semantics (treated intuitively by the judge) and intentional behaviour (no serious commitments made) by limiting itself to desktop activities. In the world of practical human affairs, where laws, organisational norms and culture matter, we exchange signs not as tokens on a kind of chess board but as instruments for action. I propose a tougher criterion for intelligence outside the laboratory or the school examination room: the subject and the judge must enter into serious social relationships in which the subject will have to discharge the commitments it accepts and account for its failures in a responsible way. The engineering of character-string manipulation in a laboratory would not suffice for systems that must play an integral role in an organisation or society.

Using the law as experimental material brought with it more than the benefits of an endless supply of complex but fairly clear norms to study; it forced us to accept some rather obvious premisses to which a training in natural science or mathematics tends to blind one. In an attempt to create a legally oriented language (Legol) that ordinary users would understand fairly easily, we adopted as a quality goal the hiding of as much structure as possible, perhaps an arbitrary criterion but one that inadvertently pointed us in the right direction. It soon became evident that every thing in the domain of everyday human affairs seems to involve time and also some responsible agent, because timelessness places things in a distant, abstract world, and we know nothing without involving someone at least as an observer. The central role of the agent began to emerge, but our understanding of time had some way to go. The emergent structure owed not a little to an analogy with the notion of spatial coordinates, but extended to a space of social as well as physical reality. Thus: to own a licence to publish this paper in the proceedings of a conference, one had to find one's way first to the literary work in question, then to the copyright in it, then to someone's ownership of the copyright and the licence incorporating a limited portion of those rights, then to the ownership of the licence by someone else, and then to the inclusion of one literary work within another and to its publication; the licence in question makes no sense without all these other things making sense, just as a point in three-dimensional space makes no sense without understanding how to move in each of the three dimensions. Clearly, at the origin of this elaborate social coordinate system we find society itself acting as the agent.

We made good progress, and the quality criterion of having a uniform, hidable structure behind every thing grew into a goal of creating a semantic normal form. Given a canonical structure to capture meanings, every analyst would contribute his or her share towards a grand, coherent structure, and computer-based systems developed for different parts of an organisation would not remain islands of automation, but would naturally link up to one another as their functions began to overlap. This goal did not seem unattainable.
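To make the logic of such a chain concrete, a reader might render it in a modern programming notation. The following sketch represents my own illustration, not the Legol notation, and all names in it stand as hypothetical; it treats each thing as existing only via its antecedents, so that asking about the licence forces one to traverse the whole chain back to society at the origin:

```python
# A sketch, not the Legol notation: each thing exists only relative to
# its antecedents, with society as the agent at the origin.
from __future__ import annotations
from dataclasses import dataclass

@dataclass(frozen=True)
class Thing:
    label: str
    antecedents: tuple[Thing, ...] = ()

    def path(self) -> list[str]:
        """List everything one must understand before this thing."""
        chain: list[str] = []
        for a in self.antecedents:
            for label in a.path():
                if label not in chain:
                    chain.append(label)
        chain.append(self.label)
        return chain

society        = Thing("society")
author         = Thing("author", (society,))
work           = Thing("literary work", (author,))
copyright_     = Thing("copyright", (work,))
owns_copyright = Thing("ownership of copyright", (author, copyright_))
licence        = Thing("licence", (owns_copyright,))
publisher      = Thing("publisher", (society,))
owns_licence   = Thing("ownership of licence", (publisher, licence))

print(owns_licence.path())
# ['society', 'publisher', 'author', 'literary work', 'copyright',
#  'ownership of copyright', 'licence', 'ownership of licence']
```

Asking for the path of the ownership of the licence yields the full chain, just as locating a point in space requires all three coordinates.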
So, at that stage, everything mentioned in a norm had, besides a label (a natural language word or a name), a sort (the kind of universal it instantiated), optional start and finish events, an authority (an agent or a norm) and any number of antecedents one had to know about in order to understand the thing in question. We implemented a new interpreter for the Legol formalism (Tagg 1979) which had the nesting of antecedents, on the basis of a paper (Stamper 1978) on the goal of a semantic normal form.

The next step suggested itself: make the start and finish mandatory. Sociologists had no problem with the idea of the social construction of at least the social part of our reality (Berger and Luckmann 1967); theologians might object, but we could leave them to their own account; the mathematicians, who study an eternal reality, posed more trouble, but a number of writers (Lakatos 1976, Bloor 1976 and 1983, Davis and Hersh 1983, Kitcher 1984, De Millo et al. 1979) made clear that one might argue for an underlying empirical and social foundation for mathematics, thus bringing its concepts under the rule that everything has a finite period of existence. This made it easier to impose on the predicates and functions the constraint that they could only exist during the coexistence of their immediate antecedents, but things of a social kind would not obey this constraint without making it difficult to handle such things as a tax liability for income in the previous year: certainly, in my experience, when the income has all gone, the tax liability remains in existence. Time remained an enigma.

But rescue came in the form of semiotics, the doctrine of signs, which makes clear the role that signs (information) play in the construction of our social reality. The blinding ontological insight that we can experience only the here-and-now, and that we can only bind the present to the past using signs, transformed the picture. As any practised tax evader will tell you, the liability only exists by virtue of the revenue service demanding payment on the basis of the record of one's income; so, avoid the demand, or better still, rid the world of the record of income, and the liability vanishes. Henceforth we applied the coexistence rule to all antecedents.

The education we all receive in natural science and mathematics encourages us to believe in an objective reality composed of individuals and all kinds of set-theoretic structures built from them, and to which our words point when we ask for their meanings. We still held that view. Nevertheless our emerging theory had all the building bricks for constructing an ontology upon quite a different basis. Clearly we can know of no reality without involving an agent of some kind, ourselves or another person or group; even for purposes of establishing commonsense knowledge we must admit a group agent as large as society as a whole; indeed, none of us would ever have learned much about the world without assimilating the perceptual framework built up over centuries by society at large. This makes possible the leap from an objectivist paradigm to a radically subjectivist one, which requires the recognition of a responsible agent behind all knowledge. The authority as a necessary attribute of everything removed from this step any technical problem: we had already embedded the solution in our schema. The difficulty of inventing individual instants or intervals in order to introduce time into an objectivist ontology had evaporated.
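A sketch of the schema element as it then stood, with the coexistence rule applied to all antecedents, might look as follows. This again counts as my illustration only: the field names and the integer event times stand in for signs and do not reproduce the Legol/Norma notation.

```python
# Sketch: a schema element with label, sort, start/finish events,
# authority and antecedents. Integer "times" stand in for the signs
# by which we actually know start and finish events.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Element:
    label: str                        # natural language word or name
    sort: str                         # the kind of universal instantiated
    start: int                        # start event (mandatory)
    finish: int                       # finish event (mandatory)
    authority: str                    # responsible agent, or a norm
    antecedents: tuple[Element, ...] = ()

    def obeys_coexistence(self) -> bool:
        """An element may exist only while all its immediate
        antecedents coexist."""
        return all(a.start <= self.start and self.finish <= a.finish
                   for a in self.antecedents)

# The tax example: the liability depends on the record of the income
# (a sign), not on the vanished income itself.
society   = Element("society", "agent", 0, 10_000, "society")
person    = Element("person", "agent", 100, 900, "society", (society,))
income    = Element("income", "flow", 300, 310, "society", (person,))
record    = Element("record of income", "sign", 300, 800, "society", (person,))
liability = Element("tax liability", "social", 320, 400, "society", (record,))

print(liability.obeys_coexistence())  # True: the record still exists,
                                      # even though the income has gone
```

Had the liability depended directly on the income, the coexistence rule would fail the moment the income ended; making it depend on the record, a sign, captures the tax evader's insight exactly.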
Everything incorporates time in its start and finish events, which we can only know through the use of signs to represent them. The trouble with our excellent education in science and mathematics arises from our having come to believe it as a matter of commonsense: we forget its sophistication. Approaches to ontology which, as in Cyc, take the space-time continuum for granted begin at the wrong end. Mankind has built up an understanding of reality in small steps that never had the assistance of Newton or Einstein. We handle time, therefore, simply by referring the events defined by an instance of anything to a sign that means something else, during whose existence the event occurred. These interrelationships constitute what we understand by time; chronological time, the time of the physicist, we can then build upon this simple prehistoric invention. Everything we experience, we experience here and now in a concrete way, including the signs that stand for other things and from which we construct the past, the future and distant things. Contrast this with the ontology of Cyc (op. cit. p. 172), which admits the existence of all kinds of intangible objects but does not mention signs! In the pursuit of a canonical semantic structure, this discipline of explaining the existence of everything in terms of what we can know directly in the here-and-now makes a major contribution. We invented modern physics as a most remarkable semiological structure, and any good ontology should help us to account for how the science emerged from our prehistoric invention of signs (language and drawings) rather than the other way round.

Arbitrariness still remained, especially in the easy-going acceptance of any number of antecedents. When analysing a problem under these lax conditions, people would advance an unhelpful diversity of solutions, so it seemed reasonable to adopt, at least for experimental purposes, the hypothesis that two antecedents would always suffice. It worked. In no case did the restriction prevent one finding a suitable analysis. But there remained the question of how to justify this constraint, which I can only answer with the following arguments:

1. From a mathematical point of view we can always transform a graph having nodes with three or more antecedents by adding dummy nodes to make it comply with the two-antecedent rule (see the sketch after this list).

2. Occam's Razor, "Entities are not to be multiplied without necessity" or "It is vain to do with more what can be done with fewer", suggests keeping the permitted number of antecedents to a minimum as a feature of the ontological theory; the fact that the rule increases the number of elements in the ontology does not violate the Razor, because these additions belong to the object of study rather than to the theory.

3. Popper (1972, p. 81) conceived his refutationist method of scientific discovery as "a method of bold conjectures and ingenious and severe attempts to refute them." A bold conjecture, in his view, goes against expectations and contradicts earlier theories rather than varying them in minor, ad hoc ways, thus exposing itself far more to the risk of refutation than any pedestrian hypothesis, so that it wins greater credibility should refutation fail. Most people seem to find the two-antecedent rule counter-intuitive, and therefore a fairly bold conjecture.

4. The objection that introducing dummy nodes would only add irrelevant features to the analysis proved false in practice; in fact, far from constituting the ad hoc solutions that Popper despises, the additional nodes always resulted in finding significant semantic features that would otherwise have remained hidden behind a larger array of antecedents.

5. The strict two-antecedent rule greatly reduces the possibility of arbitrariness in the schema, thus making it less likely that it represents a work of imagination by the analyst rather than a model of the reality evolved by a community for the domain of action. Expressed differently: the rule makes it easier to subject a schema to critical analysis and more difficult to excuse any arbitrary solution.

6. In practice, the schemas produced have delivered an unusually high degree of robustness to changes in requirements, almost certainly as a result of eliminating arbitrariness, which tends to leave instabilities lurking in the system.

In short, the principles of Occam and Popper make the theory more informative, while experience shows the schemas more informative in practice.
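The transformation in argument 1 admits a mechanical statement. This sketch comes from me, not from Norma; the dummy-naming scheme and the example names count as hypothetical. It folds surplus antecedents pairwise into dummy nodes:

```python
# Illustrative only: reduce a dependency graph so that no node keeps
# more than two antecedents, by interposing dummy nodes. The naming
# scheme for dummies is mine, not part of the Norma formalism.

def enforce_two_antecedents(graph: dict[str, list[str]]) -> dict[str, list[str]]:
    """graph maps each node to its list of antecedents."""
    result: dict[str, list[str]] = {}
    counter = 0
    for node, ants in graph.items():
        ants = list(ants)
        while len(ants) > 2:
            counter += 1
            dummy = f"_dummy{counter}"
            # Fold the first two antecedents into a dummy node.
            result[dummy] = [ants[0], ants[1]]
            ants = [dummy] + ants[2:]
        result[node] = ants
    return result

# A node with three antecedents...
before = {"contract": ["buyer", "seller", "goods"]}
print(enforce_two_antecedents(before))
# {'_dummy1': ['buyer', 'seller'], 'contract': ['_dummy1', 'goods']}
```

As argument 4 reports, such dummies tend to earn their keep: in this invented example, the buyer and seller pair invites a reading as the trading relationship between the parties, a semantic feature that the three-antecedent version left hidden.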
At this point we stood on the brink of switching from an objectivist to a subjectivist position, but it seemed impossible to let go of the notion of a universe populated by individuals. What could take their place? If the individuals exist independently of the agents who observe and use them, then the agents only play an incidental and optional role. To ensure a central role for the agent, we must abandon our faith in self-standing individuals, but how? A debate on the psychology of perception indicated the way. Certainly, no perceiving of a world occurs without an agent to do it, and perceiving always implies some kind of behaviour, at least of a passive kind. For our purposes, we found that James Gibson had proposed the most relevant theory of perception. He worked on the perceptual problems of pilots during the Second World War, the work in which he created his Theory of Affordances. Gibson gave the agent a central, creative role in perception instead of the passive role of a receiver of sense data which the agent's eyes and brain assemble into a perception of some object, first on the retina and then in the mind. The classical picture of visual perception shows rays of light radiating from the object, with the eye intercepting and sensing a few of them for the observer to interpret, as illustrated in Figure 1 (Gibson 1979, p. 59).

FIGURE 1: Classical paradigm for perception: ready-made objects being sensed

Gibson's theory replaces this classical diagram by a totally different one: an agent bombarded with signals from all directions, an agent swimming in a sea of information, an agent who perceives invariants in this information-loaded environment. Figure 2 (Gibson, p. 72) shows a little of the optical environment sensed by an observer in two different positions. The agent also senses sounds and their echoes from the walls of the room, smells, temperatures and, most importantly, his or her own movements. For its survival and well-being, the agent needs to recognise what it can do, so it must distinguish those valuable invariant repertoires of behaviour afforded by the agent-in-its-environment. The crucial ontological step forward we take by recognising that the agent need only perceive these affordances, as Gibson called these invariant repertoires of behaviour, as pathways for action. Indeed the perception of ready-made individuals begs the question of how to learn about their existence ready for the act of observing them, whereas the perception of affordances explains how we manage to introduce the notion of individuals into our universe. As Quine (1953, p. 44) expressed it in his essay "Two dogmas of empiricism": "The myth of physical objects . . . has proved more efficacious than other myths as a device for working a manageable structure into the flux of experience." Gibson has done us the service of looking behind the myth to the deeper ontological foundation on which we have chosen to build our theory.

FIGURE 2: Gibson's paradigm for the perception of invariants in the ambient information

The visual perception of affordances which Gibson studied depends upon finding spatial, geometric invariants signalling a kind of behavioural repertoire. Movement towards or away from something illustrates this. Anyone walking, but especially anyone driving a car towards a wall or flying a plane towards a runway, depends for safety upon an ability to perceive, in the combination of visual field and kinaesthetic sensations, an affordance that includes the following invariants in the optical array: a central point towards which one moves; an array of closed curves moving out from that central point at invariant speeds which depend upon their proximity to and angle from the centre and the speed of approach; and additional invariance in the rate of change of these speeds for a given acceleration by the observer (see Figure 3, from Gibson, p. 125). The visual cues merely signal the opportunity of approaching or distancing oneself from something; of themselves they have little importance compared with the behaviour made possible or necessary by proximity to food or an enemy.

FIGURE 3: Invariants in the visual field of a pilot landing an aeroplane
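One can give these optical invariants a quantitative form. As an illustration only (Gibson argued qualitatively, and the following standard optic-flow relation appears neither in his text nor in this paper): for an observer approaching a surface at speed $v$ from distance $Z$, a texture element at fixed lateral offset $x$ subtends the small angle $\theta \approx x/Z$, so that

$$\dot{\theta} \;=\; \frac{x\,v}{Z^{2}} \;=\; \theta\,\frac{v}{Z}, \qquad \frac{\theta}{\dot{\theta}} \;=\; \frac{Z}{v}.$$

Every element therefore streams outward at a speed proportional to its eccentricity, producing the expanding pattern of closed curves described above, and the ratio $\theta/\dot{\theta}$, invariant across the whole optical array, specifies the time remaining before contact, the quantity that the later perception literature calls $\tau$.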
This approach to perception introduces a remarkably unified ontology. Objects no longer exist as things that stand by themselves and have properties by themselves. A cup becomes an agent's experience of a repertoire of affordances: the ability to hold liquids in certain positions, the noise it makes in hitting various surfaces, the visual/tactile shape it displays, and so on. The entity-relationship-attribute confusion vanishes from this perspective. But we needed a still greater unification. Gibson limited his concerns to the perception of the physical world, but the notion of an affordance generalises naturally (Stamper 1985) to include the invariants that we perceive in our social world. If one has a copyright, one possesses an invariant repertoire of rights, duties, liberties and immunities, and others have their converses, in any behaviour relating to that literary work. A cup has a number of social invariants in addition to its physical ones: for example, it enables us to drink liquids in a manner acceptable in polite company, and it can enter into the invariant of ownership, just as a copyright can. We create the invariants in our social world using norms, not only legal norms but also informal and cultural norms. Hence I called the formalism under construction Norma, a (proto-)logic of norms and affordances.
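As a closing sketch (again my own illustration: Norma's actual notation differs, and the repertoires listed here count as invented for the example), one might render a cup as nothing but a repertoire of physical and social affordances, each recorded only relative to the agent for whom it holds:

```python
# Sketch: a "cup" as nothing but a repertoire of affordances known to
# some responsible agent. Physical affordances follow Gibson; social
# affordances arise from norms. All names here are illustrative.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Affordance:
    agent: str        # the responsible agent who perceives or realises it
    behaviour: str    # the invariant repertoire of behaviour
    authority: str    # what establishes it: perception, or a norm

@dataclass
class Thing:
    label: str
    repertoire: list[Affordance] = field(default_factory=list)

cup = Thing("cup", [
    # Physical affordances, perceived directly by the agent.
    Affordance("Ann", "holds liquid when upright", "perception"),
    Affordance("Ann", "rings when struck on a hard surface", "perception"),
    # Social affordances, created and sustained by norms.
    Affordance("Ann", "drinking acceptably in polite company", "cultural norm"),
    Affordance("Ann", "ownership: right to use, lend or sell", "legal norm"),
])

for a in cup.repertoire:
    print(f"{a.agent}: {a.behaviour}  [{a.authority}]")
```

On this rendering no affordance exists apart from its responsible agent: the cup reduces to what some agent can do with it, physically or socially, which captures the unification that Norma set out to formalise.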